Phase Diagram and Storage Capacity of Sequence Processing Neural Networks
Authors
Abstract
We solve the dynamics of Hopfield-type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to a violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and the correlation and response functions, in the thermodynamic limit. We calculate the time-translation-invariant solutions of these equations, describing stationary limit cycles, which leads to a phase diagram. The effective retarded self-interaction usually appearing in symmetric models is here found to vanish, which causes a significantly enlarged storage capacity of αc ∼ 0.269, compared to αc ∼ 0.139 for Hopfield networks storing static patterns. Our results are tested against extensive computer simulations and excellent agreement is found.
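To make the setup concrete, the sketch below simulates sequence recall in a network of this type, using the standard asymmetric couplings J_ij = (1/N) Σ_μ ξ_i^{μ+1} ξ_j^μ and noiseless synchronous updates, and monitoring the sequence overlap m(t) mentioned in the abstract. This is a minimal illustration under our own assumptions, not the authors' code or the generating-functional analysis; all parameter names and values (N, alpha, the 5% initial corruption) are illustrative.

```python
# Minimal sketch of a sequence-storing Hopfield-type network:
# asymmetric couplings J_ij = (1/N) * sum_mu xi_i^{mu+1} xi_j^{mu},
# synchronous noiseless dynamics, sequence overlap m(t) as order parameter.
import numpy as np

rng = np.random.default_rng(0)

N = 1000                  # number of +/-1 neurons (illustrative)
alpha = 0.2               # storage load alpha = P/N, below the reported alpha_c ~ 0.269
P = int(alpha * N)        # number of patterns stored as a cycle
t_max = 3 * P             # simulation length

# Random patterns xi^mu in {-1,+1}^N, stored as a cyclic sequence.
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Asymmetric coupling matrix: each pattern points to its successor (mod P).
J = (np.roll(xi, -1, axis=0).T @ xi) / N

# Start close to the first pattern: flip a small fraction of spins.
s = xi[0].copy()
s[rng.random(N) < 0.05] *= -1.0

overlaps = []
for t in range(t_max):
    mu_t = t % P                              # pattern the network should visit at time t
    overlaps.append(float(s @ xi[mu_t]) / N)  # sequence overlap m(t)
    s = np.sign(J @ s)                        # synchronous, noiseless update
    s[s == 0.0] = 1.0                         # break ties deterministically

print(f"mean sequence overlap over last period: {np.mean(overlaps[-P:]):.3f}")
```

For loads α well below the reported capacity αc ∼ 0.269 the stationary overlap printed by this sketch should stay close to one, while recall of the cycle should break down once α exceeds the capacity.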
Similar Articles
Phase Diagram and Storage Capacity of Sequence-Storing Neural Networks
We solve the dynamics of Hopfield–type neural networks which store sequences of patterns, close to saturation. The asymmetry of the interaction matrix in such models leads to violation of detailed balance, ruling out an equilibrium statistical mechanical analysis. Using generating functional methods we derive exact closed equations for dynamical order parameters, viz. the sequence overlap and c...
Prediction of methanol loss by hydrocarbon gas phase in hydrate inhibition unit by back propagation neural networks
Gas hydrates often form in natural gas pipelines and process equipment at high pressure and low temperature. Methanol, as a hydrate inhibitor, is injected into potential hydrate systems and then recovered from the gas phase and re-injected into the system. Since methanol loss imposes an extra cost on gas processing plants, designing a process for its reduction is necessary. In this study, an accur...
Optimally adapted multi-state neural networks trained with noise
The principle of adaptation in a noisy retrieval environment is extended here to a diluted attractor neural network of Q-state neurons trained with noisy data. The network is adapted to an appropriate noisy training overlap and training activity, which are determined self-consistently by the optimized retrieval attractor overlap and activity. The optimized storage capacity and the corresponding...
Short-term synaptic facilitation improves information retrieval in noisy neural networks
Short-term synaptic depression and facilitation have been found to greatly influence the performance of autoassociative neural networks. However, only partial results, focused for instance on the computation of the maximum storage capacity at zero temperature, have been obtained to date. In this work, we extended the study of the effect of these synaptic mechanisms on autoassociative neural net...
Improving Phoneme Sequence Recognition using Phoneme Duration Information in DNN-HSMM
Improving phoneme recognition has attracted the attention of many researchers due to its applications in various fields of speech processing. Recent research achievements show that using deep neural network (DNN) in speech recognition systems significantly improves the performance of these systems. There are two phases in DNN-based phoneme recognition systems including training and testing. Mos...
Journal title:
Volume / Issue:
Pages: -
Publication date: 1998